Predicting the Time Until a Vehicle Changes the Lane Using LSTM-Based Recurrent Neural Networks
Authors
Abstract
To plan safe and comfortable trajectories for automated vehicles on highways, accurate predictions of traffic situations are needed. So far, a lot of research effort has been spent on detecting lane change maneuvers rather than on estimating the point in time a lane change actually happens. In practice, however, this temporal information might be even more useful. This paper deals with the development of a system that accurately predicts the time to the next lane change of surrounding vehicles on highways using long short-term memory-based recurrent neural networks. An extensive evaluation based on a large real-world data set shows that our approach is able to make reliable predictions, even in the most challenging situations, with a root mean squared error around 0.7 seconds. Already 3.5 seconds prior to lane changes the predictions become highly accurate, showing a median error of less than 0.25 seconds. In summary, this article forms a fundamental step towards downstreamed highly accurate position predictions.
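The abstract describes an LSTM-based recurrent network that regresses the time (in seconds) until a surrounding vehicle's next lane change and is evaluated with the root mean squared error. The following is a minimal PyTorch sketch of such a time-to-lane-change regressor, not the authors' exact architecture: the feature layout, layer sizes, and the dummy training batch are hypothetical placeholders chosen only to illustrate the idea.

    import torch
    import torch.nn as nn

    class TimeToLaneChangeLSTM(nn.Module):
        """LSTM regressor mapping a feature sequence to the predicted time until the next lane change."""

        def __init__(self, n_features: int = 8, hidden_size: int = 64):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)  # scalar output: seconds until the lane change

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time_steps, n_features), e.g. relative positions and velocities
            _, (h_n, _) = self.lstm(x)
            return self.head(h_n[-1]).squeeze(-1)  # (batch,)

    if __name__ == "__main__":
        model = TimeToLaneChangeLSTM()
        # Dummy batch: 32 vehicles, 20 past time steps, 8 features each (illustrative only).
        x = torch.randn(32, 20, 8)
        t_true = torch.rand(32) * 10.0              # ground-truth times to lane change, in seconds
        t_pred = model(x)
        loss = nn.functional.mse_loss(t_pred, t_true)
        loss.backward()                             # one gradient step would follow in a real training loop
        print(f"RMSE: {loss.sqrt().item():.2f} s")  # RMSE in seconds, the metric reported in the paper

Training on real trajectory data and the exact input features are beyond this sketch; it only shows how an LSTM followed by a linear head can produce the scalar time-to-lane-change estimate the abstract refers to.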
Similar Resources
Phenotyping of Clinical Time Series with LSTM Recurrent Neural Networks
We present a novel application of LSTM recurrent neural networks to multilabel classification of diagnoses given variable-length time series of clinical measurements. Our method outperforms a strong baseline on a variety of metrics.
Definition Extraction with LSTM Recurrent Neural Networks
Definition extraction is the task of automatically identifying definitional sentences in unstructured text. It is useful for ontology generation, relation extraction, and question answering. Previous methods use handcrafted features generated from the dependency structure of a sentence. During this process, only part of the dependency structure is used to extract features, thu...
Abstract Meaning Representation Parsing using LSTM Recurrent Neural Networks
We present a system which parses sentences into Abstract Meaning Representations, improving state-of-the-art results for this task by more than 5%. AMR graphs represent semantic content using linguistic properties such as semantic roles, coreference, negation, and more. The AMR parser does not rely on a syntactic preparse, or heavily engineered features, and uses five recurrent neural networks ...
TTS synthesis with bidirectional LSTM based recurrent neural networks
Feed-forward deep neural network (DNN)-based text-to-speech (TTS) systems have recently been shown to outperform decision-tree clustered, context-dependent HMM TTS systems [1, 4]. However, long-span contextual effects in a speech utterance are still not easy to accommodate, due to the intrinsically feed-forward nature of DNN-based modeling. Also, to synthesize a smooth speech trajectory, th...
Journal
Journal title: IEEE Robotics and Automation Letters
Year: 2021
ISSN: 2377-3766
DOI: https://doi.org/10.1109/lra.2021.3058930